6 research outputs found

    Personal sensor wristband for smart infrastructure and control

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, February 2013. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 67-72).
    Despite the rapid expansion of computers beyond desktop systems into devices and systems in the environment around us, the control interfaces to these systems are often basic and inadequate, particularly for infrastructure systems. WristQue is a wearable interface for interacting with computerized systems in the environment, providing both explicit remote control through buttons, touch, and gestural interfaces, and automatic closed-loop control using environmental sensors on the device, fused with precise indoor location for context. Placed on the wrist, these sensors and controls can generally sense the environment unobstructed and are conveniently within reach at all times. WristQue continuously collects and streams sensor data, including temperature, humidity, activity, light, and color, through a wireless network infrastructure. A 9-DoF inertial/magnetic measurement unit can be enabled to use the WristQue as a wrist-based gestural interface to nearby devices. Location and orientation data are used to implement a pointing interface with which the user can indicate a device to control. This interface was implemented and tested using the WristQue and a commercial UWB localization system. The other sensors on the WristQue were validated by collecting several days of environmental data and conducting several controlled experiments. With these capabilities, the WristQue can be used in a number of sensing and control applications, such as lighting and comfort control.
    by Brian D. Mayton. S.M.
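    The pointing interface described above combines the wearer's UWB wrist position with the IMU-derived forearm orientation to pick out a target device. The following is a rough sketch of such a selection step, not the thesis implementation; the function name, device list, and angular threshold are invented for illustration.
```python
import numpy as np

def select_pointed_device(wrist_pos, forearm_dir, devices, max_angle_deg=15.0):
    """Return the name of the device nearest the pointing ray, or None."""
    d = np.asarray(forearm_dir, dtype=float)
    d /= np.linalg.norm(d)
    best, best_angle = None, np.radians(max_angle_deg)
    for name, pos in devices.items():
        v = np.asarray(pos, dtype=float) - np.asarray(wrist_pos, dtype=float)
        v /= np.linalg.norm(v)
        # Angle between the forearm direction and the direction to the device.
        angle = np.arccos(np.clip(np.dot(d, v), -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Hypothetical devices at known indoor coordinates (meters).
devices = {"desk_lamp": (2.0, 1.5, 1.2), "ceiling_light": (3.0, 3.0, 2.7)}
print(select_pointed_device((1.0, 1.0, 1.1), (0.95, 0.3, 0.05), devices))
```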

    TRUSS: Tracking Risk with Ubiquitous Smart Sensing

    We present TRUSS, or Tracking Risk with Ubiquitous Smart Sensing, a novel system that infers and renders safety context on construction sites by fusing data from wearable devices, distributed sensing infrastructure, and video. Wearables stream real-time levels of dangerous gases, dust, noise, light quality, altitude, and motion to base stations that synchronize the mobile devices, monitor the environment, and capture video. At the same time, low-power video collection and processing nodes track the workers as they move through the view of the cameras, identifying the tracks using information from the sensors. Together, these processes connect the context-mining wearable sensors to the video; information derived from the sensor data is used to highlight salient elements in the video stream. The augmented stream in turn provides users with a better understanding of real-time risks and supports informed decision-making. We tested our system in an initial deployment on an active construction site.
    Intel Corporation; Massachusetts Institute of Technology. Media Laboratory; Eni S.p.A.
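    One way to read the track-identification step above is as an association problem: each video track is assigned to the wearable whose motion signal it best matches. The sketch below is an illustrative guess at that step, not the TRUSS implementation; the function, worker identifiers, and correlation criterion are assumptions.
```python
import numpy as np

def match_track_to_wearable(track_motion, wearable_motion):
    """track_motion: 1-D array of per-frame motion magnitude for one video track.
    wearable_motion: dict of worker_id -> 1-D array on the same timebase.
    Returns (worker_id, correlation) for the best-matching wearable."""
    best_id, best_corr = None, -np.inf
    for worker_id, accel in wearable_motion.items():
        corr = np.corrcoef(track_motion, accel)[0, 1]
        if corr > best_corr:
            best_id, best_corr = worker_id, corr
    return best_id, best_corr

# Toy example with synthetic motion signals (purely illustrative).
t = np.linspace(0, 10, 200)
track = np.abs(np.sin(t)) + 0.1 * np.random.randn(200)
wearables = {"worker_a": np.abs(np.sin(t)), "worker_b": np.abs(np.cos(t))}
print(match_track_to_wearable(track, wearables))
```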

    Random walk and lighting control

    We pose the problem of turning off a single luminaire (or group of luminaires) as an optimal stopping problem. We present a stationary and first-passage analysis of motion data obtained using custom wireless nodes in an open office floor plan. These calculations allow us to estimate the state of the network and to compute the probability of, and expected number of steps to, visiting a state from any arbitrary state. We also investigate whether there is any evidence of clustering among the nodes by studying the covariance of the dataset. The data indicate the existence of clustering within the lattice. In other words, the random-walk analysis prevents luminaires from accidentally shutting off, and dimensionality reduction determines the correct zoning of the lighting from the occupants' movements.
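    For readers unfamiliar with first-passage analysis, the sketch below computes the two quantities the abstract refers to for a made-up three-state occupancy chain: the stationary distribution and the expected number of steps to first reach a chosen state. It is a generic illustration, not the paper's code or data.
```python
import numpy as np

# Invented 3-state occupancy transition matrix (rows sum to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Stationary distribution: eigenvector of P^T with eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()

# Expected first-passage steps to target state j from every other state:
# solve (I - Q) h = 1, where Q is P with the target row and column removed.
j = 2
keep = [i for i in range(len(P)) if i != j]
Q = P[np.ix_(keep, keep)]
h = np.linalg.solve(np.eye(len(keep)) - Q, np.ones(len(keep)))

print("stationary distribution:", pi)
print("expected steps to state", j, "from states", keep, ":", h)
```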

    Deep Learning Locally Trained Wildlife Sensing in Real Acoustic Wetland Environment

    We describe ‘Tidzam’, an application of deep learning that leverages a dense, multimodal sensor network installed at a large wetland restoration at Tidmarsh, a 600-acre former industrial-scale cranberry farm in Southern Massachusetts. Acoustic monitoring of wildlife is a crucial metric in post-restoration evaluation, but is challenging in such a noisy outdoor environment. This article presents the entire Tidzam system, which is designed to identify, in real time, ambient sounds associated with weather conditions as well as sonic events such as insects, small animals, and local bird species, from microphones deployed on the site. The experiment provides insight into the use of deep learning technology in a real deployment. The originality of this work lies in the system's ability to construct its own database from local audio sampling under the supervision of human visitors and bird experts.
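    The abstract describes real-time classification of ambient sounds from on-site microphones. The following is a generic sketch of that kind of pipeline (a log-mel spectrogram fed to a small convolutional classifier), not the Tidzam model; the class count, mel parameters, and layer sizes are placeholders.
```python
import torch
import torch.nn as nn
import torchaudio

N_CLASSES = 10  # placeholder: e.g. bird species, insects, rain, wind
mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_mels=64)

class SoundClassifier(nn.Module):
    def __init__(self, n_classes=N_CLASSES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, waveform):                    # (batch, samples)
        spec = mel(waveform).unsqueeze(1).log1p()   # (batch, 1, n_mels, frames)
        return self.net(spec)

model = SoundClassifier()
clip = torch.randn(1, 16000)                        # one second of fake audio
print(model(clip).softmax(dim=-1))
```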

    DoppelLab: Tools for exploring and harnessing multimodal sensor network data

    We present DoppelLab, an immersive sensor data browser built on a 3D game engine. DoppelLab unifies independent sensor networks and data sources within the spatial framework of a building. Animated visualizations and sonifications serve as representations of real-time data within the virtual space.
    Intel Corporation; Massachusetts Institute of Technology. Media Laboratory; Things That Think Consortium
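    A minimal sketch of the kind of mapping such visualizations imply, turning a live sensor reading into a visual parameter placed at the sensor's building coordinates; the function, sensor record, and value range below are invented for illustration and are not DoppelLab code.
```python
def temperature_to_rgb(temp_c, lo=15.0, hi=30.0):
    """Map a temperature to a blue-to-red color; the range bounds are placeholders."""
    t = min(max((temp_c - lo) / (hi - lo), 0.0), 1.0)
    return (int(255 * t), 0, int(255 * (1.0 - t)))

# Hypothetical reading from one node, with its position in building coordinates.
reading = {"sensor_id": "node-07", "temp_c": 24.5, "pos": (12.0, 3.5, 9.0)}
print(reading["sensor_id"], temperature_to_rgb(reading["temp_c"]))
```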

    The Networked Sensory Landscape: Capturing and Experiencing Ecological Change Across Scales

    What role will ubiquitous sensing play in our understanding and experience of ecology in the future? What opportunities are created by weaving a continuously sampling, geographically dense web of sensors into the natural environment, from the ground up? In this article, we explore these questions holistically and present our work on an environmental sensor network designed to support a diverse array of applications, interpretations, and artistic expressions, from primary ecological research to musical composition. Over the past four years, we have been incorporating our ubiquitous sensing framework into the design and implementation of a large-scale wetland restoration, creating a broad canvas for creative exploration at the landscape scale. The projects we present here span the development and wide deployment of custom sensor node hardware, novel web services for providing real-time sensor data to end-user applications, public-facing user interfaces for open-ended exploration of the data, as well as more radical UI modalities through unmanned aerial vehicles, virtual and augmented reality, and wearable devices for sensory augmentation. From this work, we distill the Networked Sensory Landscape, a vision for the intersection of ubiquitous computing and environmental restoration. Sensor network technologies and novel approaches to interaction promise to reshape presence, opening up sensorial connections to ecological processes across spatial and temporal scales.
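    The article mentions web services that deliver real-time sensor data to end-user applications. Sketched below is what a simple client for such a service might look like; the endpoint URL and JSON field names are hypothetical and are not the project's actual API.
```python
import json
import urllib.request

BASE_URL = "https://example.org/api"  # placeholder, not a real endpoint

def fetch_recent_readings(node_id, limit=10):
    """Fetch the most recent readings for one sensor node (hypothetical API)."""
    url = f"{BASE_URL}/nodes/{node_id}/readings?limit={limit}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example usage (would fail against the placeholder URL):
# for r in fetch_recent_readings("node-42"):
#     print(r["timestamp"], r["temperature_c"], r["humidity_pct"])
```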